
Gatsby Computational Neuroscience Unit


Ingo Steinwart

 

Wednesday 28th February 2018

 

Time: 4.00pm

 

Ground Floor Seminar Room

25 Howland Street, London, W1T 4JG

 

Learning with (Flexible) Kernels

 

 

Reproducing kernel Hilbert spaces (RKHSs) play an important role in machine learning methods such as kernel mean embeddings, support vector machines (SVMs), and Gaussian processes. Analyzing these learning methods requires a solid understanding of the underlying RKHSs and their interplay with probability measures. In the first part of my talk, I will discuss several aspects of this interplay. In the second part, I will explore the so-far largely unexploited flexibility of kernels: On the one hand, I will show that a simple sum construction for locally defined kernels makes it possible to quickly train SVMs on millions of samples without sacrificing theoretical guarantees. On the other hand, I will discuss a class of hierarchical kernels whose structure mimics parts of deep neural network architectures.
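
The sketch below gives a minimal, illustrative Python example of the general local-SVM idea mentioned in the abstract, assuming scikit-learn is available: the input space is partitioned (here with k-means as a placeholder), one Gaussian-kernel SVM is trained per cell, and each test point is routed to the model of its cell. The partitioning scheme and the names fit_local_svms / predict_local_svms are chosen here for illustration only; this is not the speaker's actual construction or the liquidSVM implementation, and it omits the theoretical guarantees discussed in the talk.

    # Illustrative local-SVM baseline (NOT the speaker's method): partition the
    # data into spatial cells and train one small SVM per cell instead of one
    # global SVM on all samples.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    def fit_local_svms(X, y, n_cells=10, gamma="scale", C=1.0, seed=0):
        """Partition the input space and fit one Gaussian-kernel SVM per cell."""
        partition = KMeans(n_clusters=n_cells, random_state=seed, n_init=10).fit(X)
        models = {}
        for cell in range(n_cells):
            mask = partition.labels_ == cell
            if np.unique(y[mask]).size < 2:
                # Degenerate cell with a single class: store a constant prediction.
                models[cell] = y[mask][0]
            else:
                models[cell] = SVC(kernel="rbf", gamma=gamma, C=C).fit(X[mask], y[mask])
        return partition, models

    def predict_local_svms(partition, models, X):
        """Route each test point to its cell and predict with that cell's model."""
        cells = partition.predict(X)
        y_pred = np.empty(len(X), dtype=int)
        for cell in np.unique(cells):
            mask = cells == cell
            model = models[cell]
            y_pred[mask] = model.predict(X[mask]) if isinstance(model, SVC) else model
        return y_pred

    if __name__ == "__main__":
        X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        partition, models = fit_local_svms(X_tr, y_tr, n_cells=10)
        acc = (predict_local_svms(partition, models, X_te) == y_te).mean()
        print(f"test accuracy: {acc:.3f}")

Because each local SVM is trained on only a fraction of the data, the cubic-in-sample-size cost of kernel SVM training is paid per cell rather than globally, which is what makes very large sample sizes tractable in principle.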